Approximate Computing for Long Short Term Memory (LSTM) Neural Networks
Authors
Abstract
Similar Resources
Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks
Long Short Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vector and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end, LSTM RNN system running on limited computational resources...
Stacked Long Short-term Memory Neural Networks
In this paper, we describe a novel approach to generate conference call-for-papers using Natural Language Processing and a Long Short-Term Memory network. The approach has been successfully evaluated on a publicly available dataset.
Long Short-Term Memory Neural Networks for Chinese Word Segmentation
Currently most state-of-the-art methods for Chinese word segmentation are based on supervised learning, whose features are mostly extracted from a local context. These methods cannot utilize the long distance information which is also crucial for word segmentation. In this paper, we propose a novel neural network model for Chinese word segmentation, which adopts the long short-term memory (LSTM)...
Long Short-Term Memory (LSTM) networks with jet constituents for boosted top tagging at the LHC
Multivariate techniques based on engineered features have found wide adoption in the identification of jets resulting from hadronic top decays at the Large Hadron Collider (LHC). Recent Deep Learning developments in this area include the treatment of the calorimeter activation as an image or supplying a list of jet constituent momenta to a fully connected network. This latter approach lends its...
Dialog state tracking using long short-term memory neural networks
Neural network based approaches have recently shown state-of-the-art performance in the Dialog State Tracking Challenge (DSTC). In DSTC, a tracker is used to assign a label to the state at each moment in an input sequence of a dialog. Specifically, deep neural networks (DNNs) and simple recurrent neural networks (RNNs) have significantly improved the performance of dialog state tracking. In this...
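The abstracts above all cast their tasks as LSTM sequence modeling, either labeling each time step or classifying a whole sequence. As a purely illustrative aid, the following is a minimal PyTorch sketch of a generic LSTM sequence classifier of the kind these systems build on; it is not code from any of the cited papers, and every layer size, feature dimension, and class count below is a hypothetical placeholder.

    # Illustrative sketch only: a generic LSTM sequence classifier in PyTorch.
    # None of the dimensions or names below come from the cited papers.
    import torch
    import torch.nn as nn

    class LSTMSequenceClassifier(nn.Module):
        """Reads a sequence of feature vectors and emits one class prediction."""

        def __init__(self, feature_dim: int, hidden_dim: int, num_classes: int):
            super().__init__()
            # One LSTM layer consumes one feature vector per time step.
            self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
            # A linear head maps the final hidden state to class logits.
            self.head = nn.Linear(hidden_dim, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x has shape (batch, time, feature_dim).
            _, (h_n, _) = self.lstm(x)       # h_n: (num_layers, batch, hidden_dim)
            return self.head(h_n[-1])        # logits: (batch, num_classes)

    if __name__ == "__main__":
        # Hypothetical setup: 300 frames of 40-dim acoustic features, 8 target languages.
        model = LSTMSequenceClassifier(feature_dim=40, hidden_dim=128, num_classes=8)
        utterances = torch.randn(4, 300, 40)   # a batch of 4 synthetic utterances
        print(model(utterances).shape)         # torch.Size([4, 8])

For per-step tasks such as word segmentation or dialog state tracking, the same skeleton would instead apply the linear head to the LSTM output at every time step rather than only to the final hidden state.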
Journal
Journal title: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Year: 2018
ISSN: 0278-0070, 1937-4151
DOI: 10.1109/tcad.2018.2858362